
    Is the Sunyaev-Zeldovich effect responsible for the observed steepening in the spectrum of the Coma radio halo?

    The spectrum of the radio halo in the Coma cluster is measured over almost two decades in frequency. The current radio data show a steepening of the spectrum at higher frequencies, which has implications for models of the radio halo's origin. There is an ongoing debate on the possibility that the observed steepening is not intrinsic to the emitted radiation but is instead caused by the Sunyaev-Zeldovich (SZ) effect. Recently, the Planck satellite measured the SZ signal and its spatial distribution in the Coma cluster, allowing this hypothesis to be tested. Using the Planck results, we calculated the modification of the radio halo spectrum by the SZ effect in three different ways. With the first two methods we measured the SZ decrement within the aperture radii used for flux measurements of the halo at the different frequencies. First, we adopted the global compilation of data from Thierbach et al. and a reference aperture radius consistent with those used by the various authors. Second, we used the available brightness profiles of the halo at different frequencies to derive the spectrum within two fixed apertures, and derived the SZ decrement using these apertures. As a third method, we used the quasi-linear correlation between the Compton parameter y and the radio halo brightness at 330 MHz discovered by Planck to derive the modification of the radio spectrum by the SZ decrement in a way that is almost independent of the adopted aperture radius. We found that the spectral modification induced by the SZ decrement is 4-5 times smaller than that necessary to explain the observed steepening. Consequently, a break or cut-off in the spectrum of the emitting electrons is necessary to explain the current data. We also show that, if a steepening is absent from the emitted spectrum, future deep observations at 5 GHz with single dishes are expected to measure a halo flux within a 40 arcmin radius that is 7-8 times higher than currently seen. (Comment: 8 pages, 6 figures, accepted in Astronomy and Astrophysics; date of acceptance 19/08/2013.)
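
    For orientation, the frequency dependence at the heart of this test can be sketched with the standard thermal SZ decrement in the Rayleigh-Jeans limit (valid at radio-halo frequencies, well below 217 GHz); this is a generic textbook relation, not the paper's full calculation:

        % Thermal SZ decrement integrated over the measurement aperture,
        % using Delta T_SZ / T_CMB ~ -2y in the Rayleigh-Jeans limit:
        \[
          \Delta S_{\mathrm{SZ}}(\nu) \;\simeq\;
          -\,\frac{4 k_B T_{\mathrm{CMB}}\,\nu^{2}}{c^{2}}
          \int_{\mathrm{aperture}} y \,\mathrm{d}\Omega .
        \]
        % Since the halo synchrotron flux falls with frequency while
        % |Delta S_SZ| grows as nu^2, the relative correction increases
        % steeply with frequency and can mimic a spectral steepening.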

    A Hardy-type inequality and some spectral characterizations for the Dirac-Coulomb operator

    We prove a sharp Hardy-type inequality for the Dirac operator. We exploit this inequality to obtain spectral properties of the Dirac operator perturbed with Hermitian matrix-valued potentials V of Coulomb type: we characterise its eigenvalues in terms of the Birman–Schwinger principle and we bound its discrete spectrum from below, showing that the ground-state energy is reached if and only if V satisfies certain rigidity conditions. In the particular case of an electrostatic potential, these conditions imply that V is the Coulomb potential.
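
    For orientation, a schematic version of the setting and of the Birman–Schwinger principle invoked above might read as follows (natural units; operator domains and the precise sub-criticality range are suppressed, and this is a generic formulation rather than the paper's statement):

        % Free Dirac operator on L^2(R^3; C^4) and a Coulomb-type bound:
        \[
          H_0 \;=\; -\,i\,\alpha\cdot\nabla \;+\; \beta ,
          \qquad
          \sup_{x}\, \big\| \, |x| \, V(x) \, \big\| \;\le\; \nu \;<\; 1 .
        \]
        % Birman-Schwinger principle: writing the Hermitian matrix potential
        % in polar form V = U|V|, a value lambda in the spectral gap (-1,1)
        % is an eigenvalue of H_0 + V if and only if -1 is an eigenvalue of
        \[
          K_{\lambda} \;=\; |V|^{1/2}\,(H_0-\lambda)^{-1}\,U\,|V|^{1/2} .
        \]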

    A BIM Template for Construction Site Planning

    Building Information Modelling (BIM) is gradually becoming the standard method for building design all over the world. Its rapid development is visible not only in the extensive research carried out on it but also in the several standards released in different countries. The spread of the method has driven continuous software improvements aimed at meeting diverse design needs as far as possible. Nevertheless, the BIM landscape still lacks tools specifically developed for construction site planning. The principal aim of the presented research is therefore to develop the use of BIM to make construction site design more efficient. Having defined, in an earlier report, the structure and contents of the proposed Construction Site Information Model, the research continues by customizing the available tools to fit the needs of a construction site designer. One of these tools is a predefined template, useful as a starting point for the design, as it is for other design disciplines. The aim is to have at one's disposal, from the beginning of the project, a model complete with a series of elements, parameters, visualization tools, and other features able to satisfy the needs of construction site design in terms of information content, level of detail, and model efficiency. A step-by-step procedure is also provided to ensure correct use and guarantee the completeness of the model. In particular, the research steps were the following: (i) analysis of several software packages to evaluate the possibilities for customizing templates; (ii) creation of the template according to the defined contents and aims of the Construction Site Model; (iii) testing and improvement of the tool in a project simulator specifically created for the purpose; (iv) application to a real case study and evaluation of its operation. The case study makes it possible to evaluate how this tool makes the site designer's work more efficient in terms of time spent and mistakes avoided.

    Fast and Accurate Error Simulation for CNNs Against Soft Errors

    The great quest to adopt AI-based computation in safety- and mission-critical applications motivates interest in methods for assessing the robustness of an application not only w.r.t. its training/tuning but also w.r.t. errors due to faults, in particular soft errors affecting the underlying hardware. Two strategies exist: architecture-level fault injection and application-level functional error simulation. We present a framework for the reliability analysis of Convolutional Neural Networks (CNNs) via an error simulation engine that exploits a set of validated error models extracted from a detailed fault injection campaign. These error models are defined based on the corruption patterns that faults induce in the output of CNN operators, and they bridge the gap between fault injection and error simulation, exploiting the advantages of both approaches. We compared our methodology against SASSIFI for the accuracy of functional error simulation w.r.t. fault injection, and against TensorFI in terms of speedup of the error simulation strategy. Experimental results show that our methodology achieves about 99% accuracy in reproducing fault effects w.r.t. SASSIFI, and a speedup ranging from 44x up to 63x w.r.t. TensorFI, which only implements a limited set of error models.
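
    To make the idea of operator-level functional error simulation concrete, here is a minimal numpy sketch of two corruption patterns applied to an operator's output tensor. The pattern names, value model, and tensor layout are illustrative placeholders, not the validated error models extracted in the paper:

        import numpy as np

        def inject_single_point(output: np.ndarray, rng: np.random.Generator) -> np.ndarray:
            # Corrupt one randomly chosen value of an operator's output tensor.
            corrupted = output.copy()
            idx = tuple(int(rng.integers(0, s)) for s in output.shape)
            corrupted[idx] = rng.uniform(output.min(), output.max())  # illustrative value model
            return corrupted

        def inject_same_row(output: np.ndarray, rng: np.random.Generator) -> np.ndarray:
            # Corrupt a contiguous run of values in one row of one feature map,
            # mimicking a spatially correlated corruption pattern (C, H, W layout).
            corrupted = output.copy()
            c = rng.integers(0, output.shape[0])
            r = rng.integers(0, output.shape[1])
            start = rng.integers(0, output.shape[2])
            length = rng.integers(1, output.shape[2] - start + 1)
            corrupted[c, r, start:start + length] = rng.uniform(
                output.min(), output.max(), size=length)
            return corrupted

        # Error simulation replays the model once per injection, corrupting the
        # output of one operator and classifying the end-to-end effect.
        rng = np.random.default_rng(0)
        conv_out = rng.standard_normal((16, 32, 32)).astype(np.float32)
        faulty = inject_same_row(conv_out, rng)
        print("corrupted values:", int((faulty != conv_out).sum()))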

    A giant radio halo in the low luminosity X-ray cluster Abell 523

    Radio halos are extended, diffuse, non-thermal radio sources found at cluster centers and not obviously associated with any individual galaxy. A strong correlation has been found between the cluster X-ray luminosity and the halo radio power. We observe and analyze the diffuse radio emission present in the complex merging structure Abell 523, classified as a low X-ray luminosity cluster, to discuss its properties in the context of the correlation between total halo radio power and X-ray luminosity. We reduced VLA archival observations at 1.4 GHz to derive a deep radio image of the diffuse emission, and compared radio, optical, and X-ray data. Low-resolution VLA images detect a giant radio halo associated with a complex merging region. The properties of this new halo agree with those of the radio halos discussed in the literature, but its radio power is about a factor of ten higher than expected on the basis of the cluster X-ray luminosity. Our study of this giant radio source demonstrates that radio halos can also be present in clusters with a low X-ray luminosity. Only a few similar cases have been found so far. This result suggests that this source represents a new class of objects that cannot be explained by classical radio halo models. We suggest that the particle re-acceleration related to merging processes is very efficient and/or that the X-ray luminosity is not a good indicator of the past merging activity of a cluster. (Comment: 5 pages, 6 figures, Astronomy and Astrophysics Letters, in press.)
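
    For context, the correlation referred to above is usually cast as a power law between the halo radio power at 1.4 GHz and the cluster X-ray luminosity; the slope quoted below is indicative of fits reported in the literature, not a value taken from this paper:

        % Radio power - X-ray luminosity correlation for giant radio halos,
        % with normalization A and slope b from published fits:
        \[
          \log P_{1.4\,\mathrm{GHz}} \;=\; A \;+\; b \,\log L_{\mathrm{X}},
          \qquad b \approx 1.5\text{--}2 ,
        \]
        % so a halo lying a factor of ~10 above the power predicted from its
        % L_X, as in Abell 523, is a strong outlier with respect to this relation.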

    Selective Hardening of CNNs based on Layer Vulnerability Estimation

    There is an increasing interest in employing Convolutional Neural Networks (CNNs) in safety-critical application fields. In such scenarios, it is vital to ensure that the application fulfills the reliability requirements expressed by customers and design standards. On the other hand, given CNNs' extremely high computational requirements, it is also paramount to achieve high performance. To meet both reliability and performance requirements, partial and selective replication of the layers of the CNN can be applied. In this paper, we identify the most critical layers of a CNN in terms of vulnerability to faults and selectively duplicate them to achieve a target reliability vs. execution time trade-off. To this end, we perform a design space exploration to identify the layers to be duplicated. Results on the application of the proposed approach to four case-study CNNs are reported.
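
    As a flavor of what such a design-space exploration might do, here is a deliberately simplified greedy sketch: layers are ranked by an assumed pre-computed vulnerability score per unit of added execution time and duplicated until a time budget is exhausted. All names and numbers are illustrative placeholders, not the paper's actual algorithm or data:

        from dataclasses import dataclass

        @dataclass
        class Layer:
            name: str
            vulnerability: float   # assumed pre-computed, e.g. via fault injection
            exec_time_ms: float    # duplicating the layer adds roughly this cost

        def select_layers_to_duplicate(layers, time_budget_ms):
            # Greedy heuristic: protect the best "vulnerability per added ms" first.
            chosen, spent = [], 0.0
            ranked = sorted(layers, key=lambda l: l.vulnerability / l.exec_time_ms,
                            reverse=True)
            for layer in ranked:
                if spent + layer.exec_time_ms <= time_budget_ms:
                    chosen.append(layer.name)
                    spent += layer.exec_time_ms
            return chosen

        # Illustrative numbers only.
        net = [Layer("conv1", 0.30, 2.0), Layer("conv2", 0.05, 4.0),
               Layer("conv3", 0.45, 3.0), Layer("fc", 0.20, 1.0)]
        print(select_layers_to_duplicate(net, time_budget_ms=5.0))  # -> ['fc', 'conv1']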

    Safety and Health Site Inspections for On-Field Risk Analysis and Training

    The field of construction is persistently affected by a large number of accidents at work, with many different causes and responsible parties. It is therefore of the utmost importance to address all of these issues in order to reduce the risk factors that can undermine individuals' safety on building sites. The objective of the research is thus to develop a method for quick on-site analysis of all the critical issues that can lead to accidents, and for identification of the related causes, in order to directly provide correct and focused training, identified as the best way to act on the causes and reduce accidents. The research was carried out during the construction of the Universal Exhibition of Milan (Expo 2015), which involved almost 70 simultaneous construction sites. To reach these goals, the following research steps were taken: (i) inspections of building sites throughout the Expo area; (ii) analysis of the main problems identified; (iii) development of a methodology to quickly identify the causes of problems; (iv) validation of the method through back-office analysis of site documents; (v) targeted on-site training according to the problems found. Over the course of construction, improvements in the resolution of critical issues were visible thanks to the focused training. The developed method, tested in a high-risk environment, is applicable to any other building site and environment, as it is independent of the boundary conditions of the place.

    Cosmic rays and Radio Halos in galaxy clusters: new constraints from radio observations

    Clusters of galaxies are sites of acceleration of charged particles and sources of non-thermal radiation. We report on new constraints on the population of cosmic rays in the intra-cluster medium (ICM) obtained via radio observations of a fairly large sample of massive, X-ray luminous galaxy clusters in the redshift interval 0.2-0.4. The bulk of the observed galaxy clusters does not show any hint of Mpc-scale synchrotron radio emission at the cluster center (a radio halo). We obtained solid upper limits on the diffuse radio emission and discuss their implications for models of the origin of radio halos. Our measurements also allow us to derive a limit on the content of cosmic-ray protons in the ICM. Assuming spectral indices of these protons delta = 2.1-2.4 and microgauss-level magnetic fields, as inferred from rotation measures, these limits are one order of magnitude deeper than present EGRET upper limits, while they are less stringent for steeper spectra and lower magnetic fields. (Comment: 14 pages, 5 figures, ApJ Letters, accepted.)
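
    The dependence on the magnetic field in the last sentence follows from the standard competition between synchrotron and inverse-Compton losses of the (secondary) electrons; the sketch below is this generic argument, given for orientation rather than as the paper's derivation:

        % Relativistic electrons radiate a fraction of their energy as
        % synchrotron set by the ratio of magnetic to total (field + CMB
        % photon) energy density:
        \[
          f_{\mathrm{syn}} \;=\; \frac{U_B}{U_B + U_{\mathrm{CMB}}}
          \;=\; \frac{B^{2}}{B^{2} + B_{\mathrm{CMB}}^{2}},
          \qquad
          B_{\mathrm{CMB}} \simeq 3.2\,(1+z)^{2}\ \mu\mathrm{G} .
        \]
        % For B well below B_CMB, a fixed radio upper limit therefore implies
        % a much weaker bound on the cosmic-ray proton content, which is why
        % the constraints relax for lower magnetic fields (and, similarly,
        % for steeper proton spectra).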

    Optimizing the Use of Behavioral Locking for High-Level Synthesis

    The globalization of the electronics supply chain requires effective methods to thwart reverse engineering and IP theft. Logic locking is a promising solution, but several concerns remain open. First, even when applied at a higher level of abstraction, locking may incur significant overhead without improving the security metric. Second, optimizing a security metric is application-dependent, and designers must evaluate and compare alternative solutions. We propose a meta-framework to optimize the use of behavioral locking during the high-level synthesis (HLS) of IP cores. Our method operates on the chip's specification (before HLS), is compatible with all HLS tools, and complements industrial EDA flows. The meta-framework supports different strategies to explore the design space and to automatically select the points to be locked. We evaluated our method on the optimization of differential entropy, achieving better results than random or topological locking: 1) we always identify a valid solution that optimizes the security metric, while topological and random locking can generate unfeasible solutions; 2) we reduce the number of bits used for locking by up to more than 90% (requiring smaller tamper-proof memories); and 3) we make better use of hardware resources, obtaining similar overheads with a higher security metric.
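
    To fix intuition about what locking a behavioral specification means, here is a toy sketch (written in Python for readability; real flows lock C/C++ specifications before HLS, and the constant, key, and function below are invented for illustration):

        SECRET_KEY = 0x3A                      # assumed provisioned in tamper-proof memory
        LOCKED_CONSTANT = 0x5C ^ SECRET_KEY    # true coefficient 0x5C is never stored in clear

        def locked_filter_tap(sample, key):
            # A locked multiply: only the correct key restores the intended
            # coefficient, so any wrong key perturbs every output it feeds.
            coefficient = LOCKED_CONSTANT ^ key
            return sample * coefficient

        assert locked_filter_tap(3, SECRET_KEY) == 3 * 0x5C   # correct key: intended behavior
        assert locked_filter_tap(3, 0x00) != 3 * 0x5C         # wrong key: corrupted behavior

    Metrics such as differential entropy then quantify how unpredictably the outputs deviate under wrong keys; the meta-framework's role is to choose which specification points to lock so that such a metric is maximized within an overhead budget.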

    Analyzing the Reliability of Alternative Convolution Implementations for Deep Learning Applications

    Convolution represents the core of Deep Learning (DL) applications, enabling the automatic extraction of features from raw input data. Several implementations of the convolution have been proposed, and their impact on the performance of DL applications has been studied. However, no specific reliability-related analysis has been carried out. In this paper, we apply the CLASSES cross-layer reliability analysis methodology for an in-depth study aimed at: i) analyzing and characterizing the effects of Single Event Upsets occurring in Graphics Processing Units while executing the convolution operators; and ii) identifying whether some convolution implementations are more robust than others. The outcomes can then be exploited to tailor better hardening schemes for DL applications, improving reliability and reducing overhead.
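
    A minimal numpy sketch of why the implementation choice matters for reliability: the same single bit flip lands in different intermediate data structures in a direct convolution versus an im2col+GEMM formulation, and therefore produces different output corruption patterns. The injection sites and the float32 bit chosen are illustrative; CLASSES itself works on validated GPU error models rather than this toy fault model:

        import numpy as np

        def conv2d_direct(x, k):
            # Direct (nested-loop) valid convolution.
            H, W = x.shape
            kh, kw = k.shape
            out = np.empty((H - kh + 1, W - kw + 1), dtype=np.float32)
            for i in range(out.shape[0]):
                for j in range(out.shape[1]):
                    out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
            return out

        def im2col(x, kh, kw):
            # Lower the input into a matrix so convolution becomes a GEMM.
            H, W = x.shape
            return np.stack([x[i:i + kh, j:j + kw].ravel()
                             for i in range(H - kh + 1)
                             for j in range(W - kw + 1)])

        def flip_bit(value, bit):
            # Flip one bit of a float32, emulating an SEU in a register or buffer.
            as_int = np.float32(value).view(np.uint32)
            return np.uint32(as_int ^ np.uint32(1 << bit)).view(np.float32)

        rng = np.random.default_rng(1)
        x = rng.standard_normal((8, 8)).astype(np.float32)
        k = rng.standard_normal((3, 3)).astype(np.float32)
        golden = conv2d_direct(x, k)

        # Fault in the input feature map: the error fans out to up to kh*kw outputs.
        x_faulty = x.copy()
        x_faulty[4, 4] = flip_bit(x[4, 4], 30)
        print("corrupted outputs, input fault :",
              int((~np.isclose(conv2d_direct(x_faulty, k), golden)).sum()))

        # The same flip inside the im2col buffer touches exactly one output:
        # a different corruption pattern for the same underlying upset.
        cols = im2col(x, 3, 3)
        cols[20, 5] = flip_bit(cols[20, 5], 30)
        faulty = (cols @ k.ravel()).reshape(6, 6)
        print("corrupted outputs, im2col fault:",
              int((~np.isclose(faulty, golden)).sum()))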